# Research Paper Pretraining
ScholarBERT
License: Apache-2.0
A BERT-large variant with 340 million parameters, pretrained on a large-scale collection of scientific papers and specialized for scientific-literature comprehension.
Tags: Large Language Model, Transformers, English
Maintainer: globuslabs
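Since the card lists the Transformers library and the globuslabs maintainer, a minimal sketch of using the model for masked-token prediction might look like the following. The hub id `globuslabs/ScholarBERT` is an assumption inferred from the maintainer name, not confirmed by this page.

```python
# Hypothetical sketch: querying ScholarBERT as a masked language model
# via Hugging Face Transformers. The hub id below is an assumption.
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_ID = "globuslabs/ScholarBERT"  # assumed hub id, not stated on this page


def fill_mask(text: str) -> str:
    """Return the model's top prediction for the [MASK] token in `text`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    logits = model(**inputs).logits
    # Locate the first [MASK] position and take the highest-scoring token.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    top_id = logits[0, mask_pos].argmax()
    return tokenizer.decode(top_id)


if __name__ == "__main__":
    # Downloads the ~340M-parameter checkpoint on first call.
    print(fill_mask("The enzyme catalyzes the [MASK] of the substrate."))
```

Because the model is BERT-large sized, the first `from_pretrained` call downloads a multi-gigabyte checkpoint; for classification or retrieval over scientific text one would typically fine-tune it rather than use raw mask filling.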
© 2025 AIbase